Subdifferential Rolle’s and Mean Value Inequality Theorems
Authors
Abstract
In this note we give a subdifferential mean value inequality for every continuous Gâteaux subdifferentiable function f on a Banach space, which requires a bound for only one, but not necessarily all, of the subgradients of f at every point of its domain. We also give a subdifferential approximate Rolle’s theorem stating that if a subdifferentiable function oscillates between −ε and ε on the boundary of the unit ball, then there exists a subgradient of the function at an interior point of the ball with norm less than or equal to 2ε.
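For orientation, the two statements can be written schematically as follows, using standard notation (S_X the unit sphere of X, ∂f(x) the set of Gâteaux subgradients of f at x); the precise hypotheses are those given in the paper:

\[
\text{(MVI)}\qquad \bigl(\forall x\ \exists\,\zeta_x \in \partial f(x):\ \|\zeta_x\| \le M\bigr) \;\Longrightarrow\; |f(b)-f(a)| \le M\,\|b-a\| \ \text{ for all } a,b,
\]
\[
\text{(approximate Rolle)}\qquad \sup_{x \in S_X} |f(x)| \le \varepsilon \;\Longrightarrow\; \exists\, x_0,\ \|x_0\| < 1,\ \exists\,\zeta \in \partial f(x_0):\ \|\zeta\| \le 2\varepsilon.
\]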
Similar Resources
Subdifferential characterization of approximate convexity: the lower semicontinuous case
Throughout, X stands for a real Banach space, SX for its unit sphere, X∗ for its topological dual, and 〈·, ·〉 for the duality pairing. All the functions f : X → R ∪ {+∞} are lower semicontinuous. The Clarke subdifferential, the Hadamard subdifferential, and the Fréchet subdifferential of f are denoted by ∂Cf, ∂Hf, and ∂Ff, respectively. The Zagrodny two-point mean value inequality has proved t...
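For reference, one common formulation of the Zagrodny two-point mean value inequality (recalled here only as a schematic sketch; the exact variant invoked in the cited paper may differ) reads: for f lower semicontinuous and finite at a, for b ≠ a and r ≤ f(b), there exist c ∈ [a, b) and sequences x_n → c with f(x_n) → f(c) and x_n^* ∈ ∂f(x_n) such that

\[
\liminf_{n\to\infty}\ \langle x_n^{*},\, b - a \rangle \;\ge\; r - f(a).
\]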
The diamond-alpha Riemann integral and mean value theorems on time scales
We study diamond-alpha integrals on time scales. A diamond-alpha version of Fermat’s theorem for stationary points is also proved, as well as Rolle’s, Lagrange’s, and Cauchy’s mean value theorems on time scales. Mathematics Subject Classification 2000: 26A42; 39A12.
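As a point of comparison, the classical delta-derivative form of the Lagrange mean value theorem on a time scale (stated here only for orientation; the cited paper proves diamond-alpha analogues) asserts that for f continuous on [a, b] and delta differentiable on [a, b), there exist ξ, τ ∈ [a, b) with

\[
f^{\Delta}(\xi) \;\le\; \frac{f(b)-f(a)}{b-a} \;\le\; f^{\Delta}(\tau).
\]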
Vector Optimization Problems and Generalized Vector Variational-Like Inequalities
In this paper, some properties of pseudoinvex functions, defined by means of limiting subdifferential, are discussed. Furthermore, the Minty vector variational-like inequality, the Stampacchia vector variational-like inequality, and the weak formulations of these two inequalities defined by means of limiting subdifferential are studied. Moreover, some relationships between the vector vari...
Relating lexicographic smoothness and directed subdifferentiability
Lexicographic derivatives developed by Nesterov and directed subdifferentials developed by Baier, Farkhi, and Roshchina are both essentially nonconvex generalized derivatives for nonsmooth nonconvex functions and satisfy strict calculus rules and mean-value theorems. This article aims to clarify the relationship between the two generalized derivatives. In particular, for scalar-valued functions...
Sensitivity Analysis of the Value Function for Optimization Problems with Variational Inequality Constraints
In this paper we perform sensitivity analysis for optimization problems with variational inequality constraints (OPVICs). We provide upper estimates for the limiting subdifferential (singular limiting subdifferential) of the value function in terms of the set of normal (abnormal) coderivative (CD) multipliers for OPVICs. For the case of optimization problems with complementarity constraints (OP...